Optimization by moving ridge functions: derivative-free optimization for computationally intensive functions

Authors

Abstract

A novel derivative-free algorithm, called optimization by moving ridge functions (OMoRF), for unconstrained and bound-constrained optimization is presented. This algorithm couples trust-region methodologies with output-based dimension reduction to accelerate the convergence of model-based optimization strategies. The dimension-reducing subspace is updated as the trust region moves through the function domain, allowing OMoRF to be applied to functions with no known global low-dimensional structure. Furthermore, its low computational requirement allows it to make rapid progress when optimizing high-dimensional functions. Its performance is examined on a set of test problems of moderate to high dimension and a design problem. The results show that OMoRF compares favourably with other common methods, even for cases in which the underlying structure is known.
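The coupling the abstract describes can be illustrated with a minimal sketch: a trust-region derivative-free loop whose one-dimensional "ridge" direction is refit from function outputs at every iteration, so the dimension-reducing subspace moves with the trust region. This is an illustrative toy, not the published OMoRF implementation; the sampling scheme, step rule, and trust-region constants below are assumptions.

```python
import numpy as np

def trust_region_subspace_dfo(f, x0, radius=1.0, max_iter=60, seed=0):
    """Toy trust-region DFO loop with a moving one-dimensional subspace
    (illustrative sketch only, NOT the published OMoRF algorithm)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    n = x.size
    for _ in range(max_iter):
        # Evaluate f at 2n random points on the trust-region boundary.
        D = rng.normal(size=(2 * n, n))
        D *= radius / np.linalg.norm(D, axis=1, keepdims=True)
        F = np.array([f(x + d) for d in D])
        # Output-based direction: least-squares linear fit F - fx ~ D @ g.
        g, *_ = np.linalg.lstsq(D, F - fx, rcond=None)
        gnorm = np.linalg.norm(g)
        if gnorm < 1e-12:
            break
        u = g / gnorm  # the current dimension-reducing direction
        # Candidate step: descend along -u to the trust-region boundary.
        x_trial = x - radius * u
        f_trial = f(x_trial)
        if f_trial < fx:       # successful step: accept and expand region
            x, fx = x_trial, f_trial
            radius = min(1.5 * radius, 10.0)
        else:                  # unsuccessful: shrink region, refit direction
            radius *= 0.5
        if radius < 1e-8:
            break
    return x, fx
```

Because the direction is refit at each new trust-region centre, no single global low-dimensional structure is assumed, which is the point the abstract makes.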


Similar articles

Randomized Derivative-Free Optimization of Noisy Convex Functions

We propose STARS, a randomized derivative-free algorithm for unconstrained optimization when the function evaluations are contaminated with random noise. STARS takes dynamic, noise-adjusted smoothing stepsizes that minimize the least-squares error between the true directional derivative of a noisy function and its finite difference approximation. We provide a convergence rate analysis of STARS ...
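The trade-off behind such noise-adjusted stepsizes can be shown numerically. The snippet below does not reproduce STARS's actual stepsize formula; it demonstrates the underlying effect with a standard heuristic: the forward-difference error of a noisy function is roughly L*h/2 (truncation) plus 2*sigma/h (noise), so a step near h = 2*sqrt(sigma/L) beats a naively tiny one. The test function, noise level, and trial count are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1e-3   # assumed known noise standard deviation
L = 1.0        # bound on |f''| for f(x) = sin(x)

def noisy_f(x):
    """sin(x) contaminated with additive Gaussian noise."""
    return np.sin(x) + rng.normal(scale=sigma)

def fd_error(x, h, trials=200):
    """Mean absolute error of the forward-difference estimate of f'(x)."""
    errs = [abs((noisy_f(x + h) - noisy_f(x)) / h - np.cos(x))
            for _ in range(trials)]
    return float(np.mean(errs))

x = 0.7
h_naive = 1e-7                      # "as small as possible" step
h_noise = 2.0 * np.sqrt(sigma / L)  # noise-adjusted step, about 0.063

err_naive = fd_error(x, h_naive)  # dominated by amplified noise (~2*sigma/h)
err_noise = fd_error(x, h_noise)  # balances truncation and noise terms
```

With the tiny step the noise term 2*sigma/h explodes to order 1e4, while the noise-adjusted step keeps the total error around 1e-2, which is the behaviour STARS exploits dynamically.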


Derivative-Free Optimization of High-Dimensional Non-Convex Functions by Sequential Random Embeddings

Derivative-free optimization methods are suitable for sophisticated optimization problems, but are hard to scale to high dimensionality (e.g., larger than 1,000). Previously, the random embedding technique has been shown to be successful for solving high-dimensional problems with low effective dimensions. However, it is unrealistic to assume a low effective dimension in many applications. This pape...
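The basic random-embedding idea referred to above can be sketched briefly: draw a random matrix A and optimize f over the low-dimensional variable y with x = A @ y, so a D-dimensional problem with low effective dimension is searched in d dimensions. This is a minimal sketch of the plain (non-sequential) embedding, with a naive hill-climbing inner solver standing in for a real low-dimensional optimizer; all parameter choices are assumptions.

```python
import numpy as np

def random_embedding_minimize(f, D, d, iters=500, seed=1):
    """Minimize a D-dimensional f by searching a random d-dimensional
    subspace: x = A @ y with Gaussian A. The inner solver is simple
    hill-climbing, a placeholder for any low-dimensional optimizer."""
    rng = np.random.default_rng(seed)
    A = rng.normal(size=(D, d))   # random embedding matrix
    y_best = np.zeros(d)
    f_best = f(A @ y_best)
    for _ in range(iters):
        y = y_best + rng.normal(scale=0.3, size=d)
        fy = f(A @ y)
        if fy < f_best:
            y_best, f_best = y, fy
    return A @ y_best, f_best

# A 100-dimensional function that depends on only 2 coordinates: the
# low-effective-dimension setting where random embeddings help.
def f_low_eff(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
```

When the effective dimension is genuinely low, a single embedding of modest d suffices; the paper's contribution is handling the case where that assumption fails.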


Derivative-Free Optimization

In many engineering applications it is common to find optimization problems where the cost function and/or constraints require complex simulations. Though it is often, but not always, theoretically possible in these cases to extract derivative information efficiently, the associated implementation procedures are typically non-trivial and time-consuming (e.g., adjoint-based methodologies). Deriv...


Derivative free optimization method

Derivative free optimization (DFO) methods are typically designed to solve optimization problems whose objective function is computed by a “black box”; hence, the gradient computation is unavailable. Each call to the “black box” is often expensive, so estimating derivatives by finite differences may be prohibitively costly. Finally, the objective function value may be computed with some noise, ...
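The evaluation cost this abstract mentions is easy to make concrete: a forward-difference gradient in n variables needs n + 1 black-box calls, each of which may be a full simulation. The counter-based stand-in below is a hypothetical illustration, not any library's API.

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Forward-difference gradient: one base call plus one call per
    coordinate, i.e. n + 1 black-box evaluations in total."""
    x = np.asarray(x, dtype=float)
    f0 = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f0) / h
    return g

calls = 0
def expensive_black_box(x):
    """Stand-in for a costly simulation; counts how often it is invoked."""
    global calls
    calls += 1
    return float(np.sum(x ** 2))

g = fd_gradient(expensive_black_box, np.ones(50))
# A 50-variable forward-difference gradient costs 51 evaluations,
# which is exactly the expense model-based DFO methods try to avoid.
```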



Journal

Journal title: Engineering Optimization

Year: 2021

ISSN: 1029-0273, 0305-215X, 1026-745X

DOI: https://doi.org/10.1080/0305215x.2021.1886286